Data Analysis, Interpretation, and Display

Overarching Question: Is the Guided Reading program at L. B. Primary School effective?

Sub Question 1: Is the Guided Reading program being implemented with fidelity, according to the guidelines established by Fountas and Pinnell (1996)?

Sub Question 2: Are students who participate in the program demonstrating improvements in reading skills over the school year?

Finding 1

The Guided Reading Program is not being implemented with fidelity, according to the guidelines established by Fountas and Pinnell (1996).

Evidence. Evidence for this finding comes from both the teacher and administrator interviews and the student focus groups. Figure 2 (Finding 1 Evidence), below, displays the various sources of data that converge to support the finding, and a summary of the supporting data analysis follows.



Figure 2. Schematic of the evidence supporting Finding 1. Multiple sources of evidence converged to support the finding.

Teacher and administrator interviews: Data analysis summary. The administrator and the Guided Reading teachers of grades 1, 2, and 3 were each interviewed by an evaluator using the same eight questions about the L. B. Primary School Guided Reading program. All interviewed stakeholders appeared relaxed and comfortable in answering the evaluator’s questions. Through the data analysis process, the evaluation team identified five dominant themes: lack of fidelity in program delivery, training, time constraints, student engagement, and student progress. Table 4, below, displays the content analysis of the data:



Table 4. Summary of key evidence gathered through administrator and teacher interviews.

Teacher and administrator interviews interpretation. Based on the evidence presented above, we conclude that the Guided Reading (GR) program at L. B. Primary School is not being implemented with fidelity, according to the guidelines established by Fountas and Pinnell (1996).

Focus groups analysis summary. In our focus group sessions, a number of students described GR as a time when they took turns reading aloud from books, engaging in what is often called ‘round robin’ reading. This contradicts the program requirement, described by Fountas and Pinnell (1996), that each student read books on their own, at their own pace. (Please see pages 21 & 22 for charts displaying the content analysis of the data from the focus groups.)

Focus groups interpretation. The evidence presented above provides further support for our conclusion that the GR program at L. B. Primary School is not being implemented with fidelity, according to the guidelines established by Fountas and Pinnell (1996).

Finding 2

Some students who participated in the Guided Reading program demonstrated improvement in reading skills over the year and some did not.

Evidence. Evidence for this finding comes from DRA results, DIBELS results, teacher and administrator interviews, and the student focus groups. Figure 3 (Finding 2 Evidence), below, displays the various sources of data that converge to support the finding; a summary of the evidence follows the chart.



Figure 3. Schematic of the evidence supporting Finding 2. Multiple sources of evidence converged to support the finding.

DRA reading level results analysis. Figure 4 (Grade 1-3 DRA Reading Level Scores), below, displays student growth in reading level scores from September to June. Although some improvement is shown, the gains are gradual, and very little progress is seen between the March and June assessment periods. The two upper outliers in Figure 4 represent students whose reading skills were strong at the beginning of the year and who maintained their strengths over the course of the year, progressing from reading level 34 to 38.

Some students’ DRA reading level scores improved over the year and some did not. Most students (42 out of 48) were reading at a higher level at the end of the year than at the beginning. Five students were reading at the same level as at the beginning of the year, and one student was reading at a lower level. Of the students who improved, 19 gained only one level (11 students) or two levels (8 students). Only one student in each grade met the end-of-year reading benchmark (grade 1 - level 16, grade 2 - level 28, grade 3 - level 38).

As noted, a pattern emerged across all grades: the majority of students did not improve their reading levels between the March and June assessments. Only 14 of the 48 students demonstrated growth in their reading level over that period; of the remaining 34 students, 4 dropped back a level and 30 read at the same level.

Grade 1 - 3 DRA Reading Level Scores

                    Sept.   Nov.   March   June
  Min                 0       2      2       2
  Q1                  6       8     10      10
  Median             12      12     14      14
  Q3                 16      18     20      20
  Max                34      34     38      38
  IQR                10      10     10      10
  Upper outliers      2       2      2       1
  Lower outliers      0       0      0       0









Figure 4. Schematic of the progression of reading level scores for students in grades 1-3 over the course of the school year. Data indicate that scores have gradually improved, with little improvement between March and June.
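The outlier counts reported for the DRA reading level scores follow the standard box-plot convention, in which values lying more than 1.5 times the interquartile range (IQR) beyond the quartiles are flagged as outliers. As an illustration only (the evaluators' plotting conventions are not stated, and the function below is our own sketch), the fences for the September data can be computed as follows:

```python
def iqr_fences(q1, q3, k=1.5):
    """Return (lower, upper) outlier fences for a box plot,
    using the conventional k * IQR rule."""
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

# September DRA summary values from Figure 4: Q1 = 6, Q3 = 16, Max = 34
lower, upper = iqr_fences(6, 16)
print(lower, upper)  # -9.0 31.0: the September maximum of 34 exceeds the
                     # upper fence, so it is plotted as an upper outlier
```

The same rule is consistent with the June data (Q3 = 20, IQR = 10, upper fence = 35), where the maximum of 38 is again flagged as an upper outlier.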



DRA comprehension, fluency, phonics, phonological awareness, and vocabulary analysis. In all of the DRA literacy assessment measures (Comprehension, Fluency, Phonics, Phonological Awareness, and Vocabulary, portrayed in Figure 5, below), three patterns emerged: gradual improvement of scores over the school year, scores that do not meet the school’s established benchmarks, and little or no improvement between March and June. The upper outliers in the June Vocabulary scores (Figure 5) and the September Comprehension scores (Figure 6) represent one student in grade 3 who demonstrated steady growth in reading and comprehension, as well as solid vocabulary knowledge, over the course of the school year.



Figure 5. Graphical representation of student scores on DRA vocabulary assessments over the course of the school year. Results from March to June show little, if any, improvement.





Figure 6. Graphical representation of changes in student DRA comprehension scores from September to June. Results indicate that comprehension scores have remained close to the same values over the course of the school year.










Figure 7. Schematic of student scores on DRA Oral Reading Fluency assessments over the course of the school year. Slight improvements were noted overall, with little, if any, growth between the March and June assessments.


Figure 8. Schematic depicting student scores on DRA phonics assessments over the course of the school year. Steady improvements were noted from September to March, then little, if any, improvement between March and June.



Figure 9. Graphical representation of student scores on DRA phonological awareness assessments over the course of the school year. Steady improvements are noted between September and March, with fewer improvements between March and June.

DRA results interpretation. On the basis of the assessment data gathered through the Developmental Reading Assessment (presented above), the Guided Reading program at L. B. Primary School is not very effective in promoting student growth in reading skills; some students are demonstrating improvements in reading skills over the school year and some are not.

DIBELS results analysis. Figure 10 (DIBELS Grades 1, 2, 3 Composite Scores), below, displays student growth in the indicators from September to June. Only two of the 48 students in the school reached the benchmark level at which they are considered to be achieving appropriate reading outcomes: one student in grade 1 and one student in grade 2; none of the students in grade 3 met the benchmark. The remaining 46 students are considered at high risk of not achieving their reading outcomes without additional support and intervention. The upper outlier in Figure 10 represents a student who transferred in from another school division, which may account for his superior performance. The two lower outliers represent a brother and sister who are recent immigrants from Outer Mongolia and have very little English, which may account for their poorer performance.

DIBELS letter naming, phoneme segmentation, nonsense word fluency, oral reading fluency, retell fluency, and word use fluency results analysis. In all of the DIBELS assessment measures (Letter Naming, Phoneme Segmentation, Nonsense Word Fluency, Oral Reading Fluency, Retell Fluency, and Word Use Fluency), three patterns emerged: gradual improvement of scores over the school year, scores that do not meet the established benchmarks (see Appendix), and a trend for most students to show little improvement, and a few to show none, between March and June.

The upper outlier in Figure 11 (Grade 1 DIBELS Letter Naming Scores) represents one student whose score was significantly above the average for September and October; in November, his score was within the Q3 limit. The outlier in Figure 12 (Grade 1 DIBELS Phoneme Segmentation Scores) represents one student whose score was consistently above the upper whisker for every month in which DIBELS was administered. The outliers in Figure 13 (Grade 1, 2 DIBELS Nonsense Word Fluency Scores) represent two individuals: one was below the lower whisker in February and March, while the other was below the lower whisker in May and June; both of these students had erratic attendance from January to June. The outliers in Figure 14 (Grade 1, 2, 3 DIBELS Retell Fluency Scores) represent three individuals: one individual’s scores are above the upper whisker in September, October, and March; another’s are above the upper whisker in November, January, and February; and a third’s are below the lower whisker in January, February, and March. There is only one outlier in Figure 15 (Grade 1, 2, 3 DIBELS Oral Reading Fluency Scores), in May. In Figure 16 (Grade 1, 2, 3 DIBELS Word Use Fluency Scores), the outliers represent two individuals: one student’s scores are above the upper whisker in September, October, November, January, and February; the other’s are below the lower whisker in October and January.



Figure 10. Graphical representation of composite DIBELS scores for grades 1, 2, and 3. Data indicate that scores improved over the course of the school year.



Figure 11. Graphical representation of student scores on DIBELS letter naming for grade 1. Results show very little improvement from September to November.





Figure 12. Graphical representation of student scores on DIBELS Phoneme Segmentation for grade 1 over the course of the school year. Results show some improvement over the school year with the whiskers expanding to a wider range as the year progresses.









Figure 13. Graphical representation of student scores on DIBELS Nonsense Word Fluency for grades 1 and 2 over the course of the school year. Steady improvements are noted between September and March, with less remarkable improvements between March and June.




Figure 14. Graphical representation of student scores on DIBELS Retell Fluency for grades 1, 2, and 3 over the course of the school year. Slight improvements are noted between September and March, with less remarkable improvements between March and June.









Figure 15. Graphical representation of student scores on DIBELS Oral Reading Fluency for grades 1, 2, and 3 over the course of the school year. Slight improvements are noted between September and March, with less remarkable improvements between March and June.





Figure 16. Graphical representation of student scores on DIBELS Word Use Fluency for grades 1, 2, and 3. Results show some improvement over the school year with the whiskers expanding to a wider range as the year progresses.

DIBELS results interpretation. On the basis of the assessment data gathered through the DIBELS (presented above), the Guided Reading program at L. B. Primary School is not very effective in promoting student growth in reading skills.

Teacher and administrator interview analysis. Mixed opinions were expressed through the teacher and administrator interviews. Some felt that GR was having a positive effect on student reading skills, serving to develop fluency, comprehension, and independence, whereas others expressed concern regarding some students’ lack of progress. Teachers felt that students were often not engaged in GR lessons, nor were they very excited about their book selections. Teachers also expressed concern over the school’s collection of GR books, noting that there was no longer a wide variety and that copies of books had been lost.

As the end of the school year approached, teachers and the administrator worried that students would not meet the end-of-year benchmarks established by the Board. Nevertheless, teachers also felt that some students were progressing well with their reading skills and that the form of GR offered served to meet those students’ needs. (See page 3 for a chart displaying the content analysis of the data from the teacher and administrator interviews.)

Teacher and administrator interview interpretation. The data from the teacher and administrator interviews, presented above, provide further evidence to support our finding that some students who participate in GR demonstrate improvements in their reading skills over the school year, and some students do not.

Focus group data summary/display. Two researchers conducted three separate focus groups: one with students from grade 1, one from grade 2, and one from grade 3. One researcher took the role of moderator while the other was the recorder. The students from each class were selected by random sample.

The researchers welcomed each group of students into the conference room and started each session with introductions, small talk, cookies, and juice in order to put the students at ease with the moderators. The students appeared comfortable with their peers, and in each instance they relaxed into an authentic dialogue.

The moderator structured the conversation around the following three questions:

  • Do you read in places other than at school?
  • What do you learn about in your guided reading sessions?
  • What helps your reading improve most at school?

Focus group content analysis. The majority of students were lively and engaged in the discussions that grew out of the guiding questions, and the focus groups provided the researchers with rich data to analyze for trends. Recurrent patterns appeared across all three grade-level focus groups, and the data cluster around three themes: enjoyment of reading, guided reading, and reading improvement. The following tables (5 & 6) display the content analysis of the data:



Table 5. Content analysis summary of data from student focus groups: enjoyment of reading and guided reading.



Table 6. Content analysis summary of data from student focus groups: student perceptions of themselves as readers.

Focus groups interpretation.

Enjoyment of reading. It is clear from the focus group discussions that reading at L. B. Primary School is not seen as enjoyable. Students do enjoy reading outside of school, where they have the opportunity to select reading material that appeals to their interests. The students describe the leveled books used for guided reading at L. B. Primary School as “old.” Students are aware of the leveling of the books, as denoted by the letters found on the cover; however, they are generally not clear as to whether they are reading at grade level.

Guided reading. The focus group data indicate a lack of clarity around what exactly guided reading is and what its purpose should be. Students were aware of, and had participated in, some small-group reading. Reading in these small groups was done primarily by taking turns (‘round robin’ style), and many students felt it was a competition to see who could finish first. The quality and content of the books were held to be poor.

Reading improvement. Students were unclear as to how reading at school should be improving their own reading skills, and they had difficulty articulating an understanding of both decoding and comprehension strategies.

On the basis of the data gathered through the focus groups, the Guided Reading program at L. B. Primary School is not very effective in promoting student growth in reading skills.

Conclusion

Based on the findings presented above, namely that the Guided Reading program is not being implemented with fidelity and that only some students who participated in the program demonstrated improvement in reading skills over the year, we conclude that the Guided Reading program at L. B. Primary School is not effective.